Nonparametric estimation via empirical risk minimization

Authors

  • Gábor Lugosi
  • Kenneth Zeger
Abstract

A general notion of universal consistency of nonparametric estimators is introduced that applies to regression estimation, conditional median estimation, curve fitting, pattern recognition, and learning concepts. General methods for proving consistency of estimators based on minimizing the empirical error are shown. In particular, distribution-free almost sure consistency of neural network estimates and generalized linear estimators is established.
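
As a rough illustration of the kind of estimator the abstract refers to, the sketch below performs empirical risk minimization over a generalized linear class. It is a minimal example, not the authors' construction; the synthetic data, cosine dictionary, squared loss, and fixed class size are all illustrative assumptions.

```python
# Minimal sketch of empirical risk minimization over a generalized linear
# class (illustrative only; basis functions, loss, and class size are my own choices).
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: Y = m(X) + noise, with m unknown to the estimator.
n = 500
X = rng.uniform(0.0, 1.0, size=n)
Y = np.sin(2 * np.pi * X) + 0.1 * rng.normal(size=n)

def design(x, k):
    # Fixed dictionary of k cosine basis functions (j = 0 gives the constant term).
    return np.column_stack([np.cos(j * np.pi * x) for j in range(k)])

k = 8                      # class size; consistency arguments let this grow with n
Phi = design(X, k)

# ERM with squared loss over a linear-in-parameters class has a closed form
# (ordinary least squares on the design matrix).
coef, *_ = np.linalg.lstsq(Phi, Y, rcond=None)

def estimate(x):
    """The empirical risk minimizer evaluated at new points x."""
    return design(np.atleast_1d(x), k) @ coef

print("empirical risk:", np.mean((Phi @ coef - Y) ** 2))
print("estimate at 0.25:", estimate(0.25))
```

With squared loss and a class that is linear in its parameters, the empirical risk minimizer is an ordinary least-squares solution; for nonlinear classes such as neural networks the minimization has no closed form, which is part of what makes the consistency analysis nontrivial.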

Similar resources

Nonparametric estimation and classification using radial basis function nets and empirical risk minimization

The authors study the convergence properties of radial basis function (RBF) networks for a large class of basis functions and review the methods and results related to this topic. The network parameters are obtained through empirical risk minimization, and the optimal nets are shown to be consistent in the problem of nonlinear function approximation and in nonparametric classification. For the cla...

Multivariate Dyadic Regression Trees for Sparse Learning Problems

We propose a new nonparametric learning method based on multivariate dyadic regression trees (MDRTs). Unlike traditional dyadic decision trees (DDTs) or classification and regression trees (CARTs), MDRTs are constructed using penalized empirical risk minimization with a novel sparsity-inducing penalty. Theoretically, we show that MDRTs can simultaneously adapt to the unknown sparsity and smooth...

A Robust Information Clustering Algorithm

We focus on the scenario of robust information clustering (RIC) based on the minimax optimization of mutual information (MI). The minimization of MI leads to the standard mass-constrained deterministic annealing clustering, which is an empirical risk-minimization algorithm. The maximization of MI works out an upper bound of the empirical risk via the identification of outliers (noisy data point...

Principles of Risk Minimization for Learning Theory

Learning is posed as a problem of function estimation, for which two principles of solution are considered: empirical risk minimization and structural risk minimization. These two principles are applied to two different statements of the function estimation problem: global and local. Systematic improvements in prediction power are illustrated in application to zip-code recognition.
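
The contrast between the two principles can be sketched in a few lines: empirical risk minimization fixes one function class and minimizes the empirical error inside it, while structural risk minimization also selects the class from a nested hierarchy by penalizing complexity. The nested polynomial classes and the penalty constant below are illustrative choices, not those of the cited paper.

```python
# Illustrative sketch of ERM vs. structural risk minimization (SRM);
# the polynomial hierarchy and penalty term are assumptions for the example.
import numpy as np

rng = np.random.default_rng(1)
n = 200
X = rng.uniform(-1.0, 1.0, size=n)
Y = np.sin(3 * X) + 0.2 * rng.normal(size=n)

def erm_fit(degree):
    """ERM with squared loss inside the class of degree-`degree` polynomials."""
    coef = np.polyfit(X, Y, degree)
    risk = np.mean((np.polyval(coef, X) - Y) ** 2)
    return coef, risk

# ERM alone: fix one class and minimize empirical risk within it.
coef_erm, risk_erm = erm_fit(degree=5)

def srm_select(max_degree=12, penalty_const=0.5):
    """SRM: compare classes by empirical risk plus a complexity penalty
    (here proportional to the number of parameters)."""
    best = None
    for d in range(max_degree + 1):
        coef, emp_risk = erm_fit(d)
        score = emp_risk + penalty_const * (d + 1) / n
        if best is None or score < best[0]:
            best = (score, d, coef)
    return best

score, degree, coef_srm = srm_select()
print(f"ERM in fixed class: empirical risk {risk_erm:.4f}")
print(f"SRM-selected degree: {degree}, penalized score {score:.4f}")
```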

Nonparametric Estimation of Spatial Risk for a Mean Nonstationary Random Field

Common methods for spatial risk estimation are investigated for a stationary random field; for simplicity, the distribution is assumed known and a parametric variogram is considered for the random field. In this paper, we study a nonparametric method for spatial risk. In this method, we model the random field trend by a local linear estimator, and through bias-corrected residuals, ...
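
The local linear trend estimator mentioned above can be sketched as a kernel-weighted least-squares fit at each evaluation point. The one-dimensional setting, Gaussian kernel, and bandwidth below are illustrative assumptions rather than the authors' spatial implementation.

```python
# One-dimensional sketch of a local linear estimator (illustrative only;
# the cited work applies the idea to the trend of a spatial random field).
import numpy as np

rng = np.random.default_rng(2)
n = 300
X = rng.uniform(0.0, 1.0, size=n)
Y = 2.0 * X + np.sin(4 * np.pi * X) + 0.3 * rng.normal(size=n)

def local_linear(x0, X, Y, bandwidth=0.05):
    """Weighted least squares of Y on (1, X - x0) with Gaussian kernel weights;
    the intercept is the local linear estimate of the trend at x0."""
    w = np.exp(-0.5 * ((X - x0) / bandwidth) ** 2)
    A = np.column_stack([np.ones_like(X), X - x0])
    sw = np.sqrt(w)
    beta, *_ = np.linalg.lstsq(sw[:, None] * A, sw * Y, rcond=None)
    return beta[0]

grid = np.linspace(0.05, 0.95, 10)
trend = np.array([local_linear(x0, X, Y) for x0 in grid])
residuals = Y - np.array([local_linear(x, X, Y) for x in X])
print("estimated trend on grid:", np.round(trend, 3))
print("residual std:", residuals.std().round(3))
```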


Journal:
  • IEEE Trans. Information Theory

Volume 41, Issue -

Pages -

Publication date 1995